On the use of auxiliary variables in Markov chain Monte Carlo sampling
Authors
Abstract
We study the slice sampler, a method of constructing a reversible Markov chain with a specified invariant distribution. Given an independence Metropolis-Hastings algorithm, it is always possible to construct a slice sampler that dominates it in the Peskun sense. This means that the resulting Markov chain produces estimates with a smaller asymptotic variance. Furthermore, the slice sampler has a smaller second-largest eigenvalue than the corresponding independence Metropolis-Hastings algorithm, which ensures faster convergence to the distribution of interest. A sufficient condition for uniform ergodicity of the slice sampler is given, and an upper bound for the rate of convergence to stationarity is provided.
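As a rough illustration of the auxiliary-variable construction behind the slice sampler, the sketch below targets a standard normal density, for which the slice {x : f(x) >= u} is an interval available in closed form. It is a minimal example of the general mechanism only, under assumptions chosen for simplicity; the paper's specific construction that dominates a given independence Metropolis-Hastings chain, and the associated variance and eigenvalue comparisons, are not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

def slice_sampler_normal(n_iter=10000, x0=0.0):
    # Minimal slice sampler sketch for pi(x) proportional to f(x) = exp(-x^2 / 2).
    x = x0
    samples = np.empty(n_iter)
    for i in range(n_iter):
        # Auxiliary variable: u | x ~ Uniform(0, f(x)).
        u = rng.uniform(0.0, np.exp(-0.5 * x**2))
        # The slice {x : f(x) >= u} is the interval [-w, w] with w = sqrt(-2 log u).
        w = np.sqrt(-2.0 * np.log(u))
        # Update: x | u ~ Uniform on the slice.
        x = rng.uniform(-w, w)
        samples[i] = x
    return samples

draws = slice_sampler_normal()
print(draws.mean(), draws.std())  # should be close to 0 and 1, respectively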
Similar articles
Gibbs Sampling for Bayesian Non-Conjugate and Hierarchical Models by Using Auxiliary Variables
We demonstrate the use of auxiliary (or latent) variables for sampling non-standard densities which arise in the context of the Bayesian analysis of non-conjugate and hierarchical models by using a Gibbs sampler. Their strategic use can result in a Gibbs sampler having easily sampled full conditionals. We propose such a procedure to simplify or speed up the Markov chain Monte Carlo algorithm. T...
Efficient Sampling for Gaussian Process Inference using Control Variables
Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables which are auxiliary function values that provide a low dimensional representation of the function. At each iteration, t...
Hamming Ball Auxiliary Sampling for Factorial Hidden Markov Models
We introduce a novel sampling algorithm for Markov chain Monte Carlo-based Bayesian inference for factorial hidden Markov models. This algorithm is based on an auxiliary variable construction that restricts the model space allowing iterative exploration in polynomial time. The sampling approach overcomes limitations with common conditional Gibbs samplers that use asymmetric updates and become e...
Markov Chain Monte Carlo Methods for Statistical Inference
These notes provide an introduction to Markov chain Monte Carlo methods and their applications to both Bayesian and frequentist statistical inference. Such methods have revolutionized what can be achieved computationally, especially in the Bayesian paradigm. The account begins by discussing ordinary Monte Carlo methods: these have the same goals as the Markov chain versions but can only rarely ...
Sequential Monte Carlo for Graphical Models
We propose a new framework for how to use sequential Monte Carlo (SMC) algorithms for inference in probabilistic graphical models (PGM). Via a sequential decomposition of the PGM we find a sequence of auxiliary distributions defined on a monotonically increasing sequence of probability spaces. By targeting these auxiliary distributions using SMC we are able to approximate the full joint distrib...